Bagging and Boosting performance in Projection Pursuit Regression

Authors

  • Simone Borra
  • Agostino Di Ciaccio
Abstract

Recently, many authors have proposed new algorithms to improve the accuracy of classifiers on artificial and real data sets. The goal is to assemble a collection of individual classifiers based on resampling of the data set. Bagging (Breiman, 1996) and AdaBoost (Freund & Schapire, 1997) are the most widely used procedures: the first fits many classifiers to bootstrap samples of the data and classifies units by majority vote; the second fits many classifiers by reweighting the data and classifies units by weighted majority vote. The success of these methods is usually explained in terms of the bias-variance components of the generalization error. In the regression context, the application of these techniques has received little investigation. Drucker (1997) and other authors modified AdaBoost to use regression trees as predictors. Our aim is to verify by simulation whether boosting or bagging can jointly reduce training-set error and generalization error when Projection Pursuit Regression is used as the predictor.
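As a concrete illustration of the bagging scheme described in the abstract, a regression committee aggregates by averaging rather than by majority vote. The sketch below uses ordinary least squares as a stand-in base learner (the paper uses Projection Pursuit Regression); `bagged_predict` and its parameters are illustrative names, not from the paper.

```python
import numpy as np

def bagged_predict(X_train, y_train, X_test, n_models=25, seed=0):
    """Bagging for regression (Breiman, 1996): fit one base learner per
    bootstrap sample of the training data, then average the predictions."""
    rng = np.random.default_rng(seed)
    n = len(y_train)
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)           # bootstrap sample (with replacement)
        Xb, yb = X_train[idx], y_train[idx]
        # Base learner: ordinary least squares, a simple stand-in for the
        # Projection Pursuit Regression predictor used in the paper.
        coef, *_ = np.linalg.lstsq(Xb, yb, rcond=None)
        preds.append(X_test @ coef)
    return np.mean(preds, axis=0)                  # aggregate by averaging
```

Averaging over bootstrap replicates mainly reduces the variance component of the generalization error, which is why bagging helps most with unstable base learners.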


Related articles

Improving Regressors using Boosting Techniques

In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all cases, boosting is at least equivalent, and...
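A boosting committee for regression can be sketched with a simplified residual-fitting scheme: each round fits a weak learner to the current residuals and adds a damped correction. This is a generic illustration of the committee idea under assumed single-feature least-squares weak learners, not Drucker's AdaBoost.R2 or the exact algorithms in these papers; `boosted_predict` is a hypothetical name.

```python
import numpy as np

def boosted_predict(X_train, y_train, X_test, n_rounds=50, lr=0.5, seed=0):
    """Simplified boosting sketch for regression: sequentially fit weak
    learners (one randomly chosen feature, fit by least squares) to the
    training residuals and accumulate damped corrections."""
    residual = y_train.astype(float).copy()
    pred = np.zeros(len(X_test))
    rng = np.random.default_rng(seed)
    for _ in range(n_rounds):
        j = rng.integers(X_train.shape[1])        # weak learner: a single feature
        x = X_train[:, j]
        coef = (x @ residual) / (x @ x)           # one-dimensional least squares
        residual -= lr * coef * x                 # shrink the training residuals
        pred += lr * coef * X_test[:, j]          # accumulate the committee prediction
    return pred
```

Unlike bagging, each member here depends on its predecessors through the residuals, so the committee attacks the bias component as well as the variance.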


Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran

An accurate reservoir characterization is a crucial task for the development of quantitative geological models and reservoir simulation. In the present research work, a novel view is presented on the reservoir characterization using the advantages of thin section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...


Boosting and Bagging of Neural Networks with Applications to Financial Time Series

Boosting and bagging are two techniques for improving the performance of learning algorithms. Both techniques have been successfully used in machine learning to improve the performance of classification algorithms such as decision trees and neural networks. In this paper, we focus on the use of feedforward back-propagation neural networks for time series classification problems. We apply boosting ...


Combining Bagging and Additive Regression

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as base learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in t...


L1 regularized projection pursuit for additive model learning

In this paper, we present an L1-regularized projection pursuit algorithm for additive model learning. Two new algorithms are developed, for regression and classification respectively: sparse projection pursuit regression and sparse Jensen-Shannon Boosting. The introduced L1-regularized projection pursuit encourages sparse solutions; thus our new algorithms are robust to overfitting and present be...
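The sparsity-inducing effect of an L1 penalty can be illustrated with a generic coordinate-descent solver for L1-regularized least squares (a minimal sketch under standard lasso assumptions, not the paper's sparse projection pursuit algorithm; `lasso_cd` and `soft_threshold` are hypothetical names):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam=0.1, n_iter=100):
    """Minimal L1-regularized least squares by cyclic coordinate descent.
    Coefficients whose partial correlation with the residual falls below
    lam are set exactly to zero, producing a sparse solution."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual without feature j
            z = X[:, j] @ r / n
            beta[j] = soft_threshold(z, lam) / (X[:, j] @ X[:, j] / n)
    return beta
```

The soft-thresholding step is what zeroes out weak coefficients, which is the mechanism by which L1 regularization guards against overfitting.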



Publication date: 1999